Bagging KNN Classifiers using Different Expert Fusion Strategies
Authors
Abstract
An experimental evaluation of bagging K-nearest neighbor (KNN) classifiers is performed. The goal is to investigate whether soft methods of aggregation yield better results than Sum and Vote. We evaluate the performance of Sum, Product, MProduct, Minimum, Maximum, Median and Vote under varying parameters. Across different training set sizes, the results show a minor improvement from combining with Sum and MProduct. At very small sample sizes, bagging KNN classifiers yields no improvement. While Minimum and Maximum fail to improve at almost every training set size, Vote and Median show an improvement when larger training sets are used. Reducing the number of features at large training set sizes improves the performance of the leading fusion strategies.
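The abstract only reports the findings, but the scheme it evaluates — bootstrap-resampled KNN base classifiers whose outputs are combined by fixed fusion rules — can be sketched as below. This is a minimal illustration, not the authors' implementation: the synthetic dataset, the bagged_knn_fusion helper, and the choices of bag count and k are assumptions for the example, and MProduct is omitted because its exact definition is not given in this excerpt; Sum, Product, Minimum, Maximum, Median and Vote are shown.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def bagged_knn_fusion(X_train, y_train, X_test, n_bags=25, k=5, rule="sum", seed=None):
    """Bag n_bags bootstrap-trained KNN classifiers and fuse their outputs.

    Assumes class labels are 0..n_classes-1 and that every bootstrap sample
    contains all classes (reasonable for a balanced training set of this size).
    """
    rng = np.random.default_rng(seed)
    n = len(X_train)
    probas, votes = [], []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)                       # bootstrap sample (with replacement)
        knn = KNeighborsClassifier(n_neighbors=k).fit(X_train[idx], y_train[idx])
        probas.append(knn.predict_proba(X_test))               # soft outputs for Sum/Product/...
        votes.append(knn.predict(X_test))                      # hard labels for Vote
    P = np.stack(probas)                                       # (n_bags, n_test, n_classes)
    if rule == "sum":
        fused = P.sum(axis=0)
    elif rule == "product":
        fused = P.prod(axis=0)
    elif rule == "min":
        fused = P.min(axis=0)
    elif rule == "max":
        fused = P.max(axis=0)
    elif rule == "median":
        fused = np.median(P, axis=0)
    elif rule == "vote":
        V = np.stack(votes)                                    # (n_bags, n_test)
        return np.array([np.bincount(col).argmax() for col in V.T])
    else:
        raise ValueError(f"unknown fusion rule: {rule}")
    return fused.argmax(axis=1)

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for rule in ("sum", "product", "min", "max", "median", "vote"):
    acc = (bagged_knn_fusion(X_tr, y_tr, X_te, rule=rule, seed=0) == y_te).mean()
    print(f"{rule:8s} accuracy = {acc:.3f}")
```

The soft rules combine the per-class probability estimates before taking the arg-max, whereas Vote aggregates only the hard labels; the paper's comparison concerns how these choices behave as training set size and feature count vary.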
Similar Resources
Improving reservoir rock classification in heterogeneous carbonates using boosting and bagging strategies: A case study of early Triassic carbonates of coastal Fars, south Iran
An accurate reservoir characterization is a crucial task for the development of quantitative geological models and reservoir simulation. In the present research work, a novel view is presented on the reservoir characterization using the advantages of thin section image analysis and intelligent classification algorithms. The proposed methodology comprises three main steps. First, four classes of...
The NTU Toolkit and Framework for High-Level Feature Detection at TRECVID 2007
In TRECVID 2007 high-level feature (HLF) detection, we extend the well-known LIBSVM and develop a toolkit specifically for HLF detection. The package shortens the learning time and provides a framework for researchers to easily conduct experiments. We efficiently and effectively aggregate detectors trained on past data to achieve better performance. We propose post-processing techniques, conc...
Improving experimental studies about ensembles of classifiers for bankruptcy prediction and credit scoring
Previous studies about ensembles of classifiers for bankruptcy prediction and credit scoring have been presented. In these studies, different ensemble schemes for complex classifiers were applied, and the best results were obtained using the Random Subspace method. The Bagging scheme was one of the ensemble methods used in the comparison. However, it was not correctly used. It is very important...
An experimental study on diversity for bagging and boosting with linear classifiers
In classifier combination, it is believed that diverse ensembles have a better potential for improvement on the accuracy than nondiverse ensembles. We put this hypothesis to a test for two methods for building the ensembles: Bagging and Boosting, with two linear classifier models: the nearest mean classifier and the pseudo-Fisher linear discriminant classifier. To estimate diversity, we apply n...
Application of ensemble learning techniques to model the atmospheric concentration of SO2
In view of pollution prediction modeling, the study adopts homogenous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...